Bayesian Nonparametric Tests via Sliced Inverse Modeling

Authors

  • Bo Jiang
  • Chao Ye
  • Jun S. Liu
Abstract

We study the problem of testing independence and conditional independence between categorical covariates and a continuous response variable, a problem with immediate applications in genetics. Instead of estimating the conditional distribution of the response given the covariates, we model the conditional distribution of the covariates given the discretized response (the “slices”). By assigning a prior probability to each possible discretization scheme, we can efficiently compute a Bayes factor (BF) statistic for the independence (or conditional independence) test using a dynamic programming algorithm. We study asymptotic and finite-sample properties of the BF statistic, such as its power and null distribution, and further develop a stepwise variable selection method based on it. We compare the BF statistic with existing classical methods and demonstrate its statistical power through extensive simulation studies. We apply the proposed method to a mouse genetics data set with the aim of detecting quantitative trait loci (QTLs) and obtain promising results.
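
The abstract's key computational idea is to treat every partition of the response-sorted sample into contiguous slices as a candidate discretization, score each slice with a Dirichlet-multinomial marginal likelihood, and sum over all slicings with a per-slice prior penalty via dynamic programming. The Python sketch below illustrates that recursion under simplifying assumptions; the function names, the symmetric Dirichlet hyperparameter `alpha`, and the per-slice log-prior `log_lambda` are illustrative choices, not the paper's exact prior specification.

```python
import numpy as np
from scipy.special import gammaln

def dirmult_logml(counts, alpha=1.0):
    """Log marginal likelihood of categorical counts under a symmetric Dirichlet(alpha) prior."""
    counts = np.asarray(counts, dtype=float)
    k = counts.size
    n = counts.sum()
    return (gammaln(k * alpha) - gammaln(n + k * alpha)
            + np.sum(gammaln(counts + alpha)) - k * gammaln(alpha))

def log_bayes_factor(x, y, n_categories, alpha=1.0, log_lambda=-1.0):
    """Sketch of a dynamic-programming Bayes factor for testing independence between a
    categorical covariate x (integer codes 0..n_categories-1) and a continuous response y.
    Observations are sorted by y; every partition of the sorted sample into contiguous
    slices is a candidate discretization, with a log-prior penalty log_lambda per slice."""
    order = np.argsort(y)
    x = np.asarray(x)[order]
    n = len(x)
    # cumulative one-hot counts: cum[j] - cum[i] gives category counts in slice i..j-1
    onehot = np.zeros((n + 1, n_categories))
    onehot[np.arange(1, n + 1), x] = 1.0
    cum = np.cumsum(onehot, axis=0)

    def slice_logml(i, j):
        return dirmult_logml(cum[j] - cum[i], alpha)

    # f[j] = log of the prior-weighted sum of marginal likelihoods over all
    # slicings of the first j (sorted) observations
    f = np.full(n + 1, -np.inf)
    f[0] = 0.0
    for j in range(1, n + 1):
        terms = [f[i] + log_lambda + slice_logml(i, j) for i in range(j)]
        f[j] = np.logaddexp.reduce(terms)

    # null model: all observations form a single slice (x independent of y)
    return f[n] - slice_logml(0, n)
```

A positive log Bayes factor favors dependence between x and y. The recursion above costs O(n^2) evaluations of the slice marginal likelihood; the paper's dynamic program and its prior over discretization schemes may differ in detail.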


Similar resources

Reference curves estimation via Sliced Inverse Regression

In order to obtain reference curves for data sets when the covariate is multidimensional, we propose a new methodology based on dimension-reduction and nonparametric estimation of conditional quantiles. This semiparametric approach combines sliced inverse regression (SIR) and a kernel estimation of conditional quantiles. The convergence of the derived estimator is shown. By a simulation study, ...


Dimension Reduction Based on Canonical Correlation

Dimension reduction is helpful and often necessary in exploring nonlinear or nonparametric regression structures with a large number of predictors. We consider using the canonical variables from the design space whose correlations with a spline basis in the response space are significant. The method can be viewed as a variant of sliced inverse regression (SIR) with simple slicing replaced by Bs...


A note on shrinkage sliced inverse regression

We employ Lasso shrinkage within the context of sufficient dimension reduction to obtain a shrinkage sliced inverse regression estimator, which provides easier interpretations and better prediction accuracy without assuming a parametric model. The shrinkage sliced inverse regression approach can be employed for both single-index and multiple-index models. Simulation studies suggest that the new...


Sliced inverse regression in reference curves estimation

In order to obtain reference curves for data sets when the covariate is multidimensional, we propose in this paper a new procedure based on dimension-reduction and nonparametric estimation of conditional quantiles. This semiparametric approach combines sliced inverse regression (SIR) and a kernel estimation of conditional quantiles. The asymptotic convergence of the derived estimator is shown. ...


An introduction to dimension reduction in nonparametric kernel regression

Nonparametric regression is a powerful tool to estimate nonlinear relations between some predictors and a response variable. However, when the number of predictors is high, nonparametric estimators may suffer from the curse of dimensionality. In this chapter, we show how a dimension reduction method (namely Sliced Inverse Regression) can be combined with nonparametric kernel regression to overc...
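
As a rough illustration of the combination described above, the sketch below estimates sufficient directions with a basic SIR step (standardize the predictors, average them within slices of the sorted response, take leading eigenvectors of the between-slice covariance of those means) and then applies a Nadaraya-Watson kernel smoother on the resulting one-dimensional index. This is a minimal sketch, not the chapter's implementation; the slice count, bandwidth, and toy data are arbitrary assumptions.

```python
import numpy as np

def sir_directions(X, y, n_slices=10, n_components=1):
    """Basic sliced inverse regression: leading eigenvectors of the between-slice
    covariance of whitened predictor means, mapped back to the original X scale."""
    n, p = X.shape
    Xc = X - X.mean(axis=0)
    cov = np.cov(Xc, rowvar=False)
    evals, evecs = np.linalg.eigh(cov)
    cov_inv_sqrt = evecs @ np.diag(evals ** -0.5) @ evecs.T  # whitening matrix
    Z = Xc @ cov_inv_sqrt
    M = np.zeros((p, p))
    for idx in np.array_split(np.argsort(y), n_slices):  # slices of the sorted response
        m = Z[idx].mean(axis=0)
        M += (len(idx) / n) * np.outer(m, m)
    w, v = np.linalg.eigh(M)
    # eigh returns eigenvalues in ascending order, so reverse to take the leading vectors
    return cov_inv_sqrt @ v[:, ::-1][:, :n_components]

def nadaraya_watson(t_train, y_train, t_new, bandwidth=0.5):
    """Gaussian-kernel regression of y on the one-dimensional SIR index."""
    d = (np.asarray(t_new)[:, None] - np.asarray(t_train)[None, :]) / bandwidth
    w = np.exp(-0.5 * d ** 2)
    return (w @ y_train) / w.sum(axis=1)

# toy usage: the response depends on X only through a single linear index
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 8))
y = np.sin(X @ np.r_[1.0, 1.0, np.zeros(6)]) + 0.1 * rng.normal(size=500)
B = sir_directions(X, y, n_slices=10, n_components=1)   # estimated direction(s), shape (8, 1)
y_hat = nadaraya_watson((X @ B).ravel(), y, (X @ B).ravel())
```

Because the kernel smoother operates on the low-dimensional SIR index rather than on all eight predictors, it avoids the curse of dimensionality that the blurb above refers to.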



Publication date: 2015